Building and publishing the figmin-mcp adapter
----------------------------------------------

We publish one distribution to PyPI:

  figmin-mcp: thin stdio adapter that AI clients (Claude Desktop,
              Claude Code, Codex, etc.) spawn. Connects to the
              figmin-bridge daemon over loopback WebSocket on
              127.0.0.1:39573.

There is only one variant: the adapter has no ML dependencies, so it
runs identically on every platform regardless of GPU. There is no CPU
vs GPU split here, just one pyproject.toml and one PyPI name.
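Since the adapter is only useful when it can reach the bridge daemon on
127.0.0.1:39573, a quick way to sanity-check the endpoint before digging
into the MCP layer is to probe the port. A minimal sketch; the
`bridge_is_listening` helper is illustrative, not part of the package:

```python
import socket

BRIDGE_HOST = "127.0.0.1"
BRIDGE_PORT = 39573  # figmin-bridge loopback WebSocket port


def bridge_is_listening(host: str = BRIDGE_HOST, port: int = BRIDGE_PORT,
                        timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on the bridge port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, start (or restart) the figmin-bridge daemon before
troubleshooting the adapter itself.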

To cut a new build and publish it:

  1. Bump the `version = "..."` line in figmin-mcp/pyproject.toml.
  2. Delete any existing dist/ folder.
  3. From figmin-mcp/ in cmd.exe:
       uv build
     Check that dist/ now contains a wheel and an sdist for the
     version you just set.
  4. Then:
       uv publish
     This uploads the wheel + sdist from dist/ to PyPI; uv needs a
     PyPI API token (or other credentials) configured for this to
     succeed.
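Step 1 is easy to fumble by hand. A small script can rewrite the
`version = "..."` line instead; a sketch, assuming the standard
`[project]` table layout in pyproject.toml (the `bump_version` helper
is ours, not something shipped in the repo):

```python
import re
from pathlib import Path


def bump_version(pyproject: Path, new_version: str) -> str:
    """Rewrite the `version = "..."` line in-place; return the old version."""
    text = pyproject.read_text(encoding="utf-8")
    match = re.search(r'^version\s*=\s*"([^"]+)"', text, flags=re.MULTILINE)
    if match is None:
        raise ValueError("no version line found in pyproject.toml")
    old = match.group(1)
    pyproject.write_text(
        text[:match.start(1)] + new_version + text[match.end(1):],
        encoding="utf-8",
    )
    return old
```

Run it against figmin-mcp/pyproject.toml before `uv build` so the wheel
filename picks up the new version.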

Pre-publish sanity checks
-------------------------
The adapter's import-hygiene test guards against the most likely
regression — accidentally importing onnxruntime, faster-whisper, numpy,
soundfile, ctranslate2, or any figmin_bridge module. Run it before
publishing:

  cd figmin-mcp
  uv run pytest tests/test_import_hygiene.py -v

If that fails, do not publish — the adapter has accidentally pulled in
ML deps and would force users to download hundreds of MB of unused code
just to use the MCP tools.
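The core of such a test is simple: import the adapter, then assert that
none of the heavy packages ended up in `sys.modules`. A minimal sketch
of the idea; the helper name and exact check are illustrative, the real
test lives in tests/test_import_hygiene.py:

```python
import sys

# Top-level packages the adapter must never pull in.
FORBIDDEN = {"onnxruntime", "faster_whisper", "numpy",
             "soundfile", "ctranslate2", "figmin_bridge"}


def forbidden_loaded(modules=None):
    """Return the forbidden top-level packages present among `modules`."""
    names = sys.modules if modules is None else modules
    return sorted({m.split(".")[0] for m in names} & FORBIDDEN)
```

In the real test you would `import figmin_mcp` first and then assert
`forbidden_loaded() == []`.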

Local development without PyPI
------------------------------
Configure the AI client to spawn the adapter directly from the working
tree:

    "figmin-xr": {
      "command": "uv",
      "args": [
        "run",
        "--project",
        "C:\\workspace\\FigminTiltBrush\\WebApp\\mcp\\figmin-mcp",
        "figmin-mcp"
      ],
      "env": {
        "UV_CACHE_DIR": "D:\\uv-cache"
      }
    }
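What the client does with that entry is spawn `command` with `args`
appended and `env` merged over its own environment. You can reproduce
the spawn by hand when debugging; a sketch, assuming the entry above is
held as a Python dict (the paths are the example ones, adjust to your
checkout):

```python
import os
import subprocess

entry = {
    "command": "uv",
    "args": ["run", "--project",
             r"C:\workspace\FigminTiltBrush\WebApp\mcp\figmin-mcp",
             "figmin-mcp"],
    "env": {"UV_CACHE_DIR": r"D:\uv-cache"},
}


def spawn_argv(entry):
    """Build the argv list the MCP client would execute."""
    return [entry["command"], *entry.get("args", [])]

# To actually launch it, merge the entry's env over the current one:
#   subprocess.run(spawn_argv(entry), env={**os.environ, **entry.get("env", {})})
```

Running that argv in a terminal shows the adapter's stderr directly,
which is usually faster than digging through the client's logs.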

The adapter must always be paired with a running figmin-bridge or
figmin-bridge-gpu daemon — see figmin-bridge/mcp projects/MCP Bridge
build instructions.txt for the daemon's local-dev configuration.
